The joint cdf \(F(x,y) = P(X \le x, Y \le y)\) can be calculated: \[F(x,y) = \int\limits_{-\infty}^y\!\int\limits_{-\infty}^x f(u,v)\,du\,dv\]
It is a mystery of multivariable calculus how to obtain \(f\) from \(F\)
Last edited: 2016-11-01 22:15
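As a sanity check, the double integral defining \(F\) can be approximated numerically. A minimal Python sketch, assuming the worked density \(f(x,y)=xy\) on \(0<x<1\), \(0<y<2\) from the example later in these notes, using a midpoint Riemann sum:

```python
def f(x, y):
    # Joint density f(x, y) = x*y on 0 < x < 1, 0 < y < 2
    # (the running example in these notes); 0 outside that rectangle.
    return x * y if 0 < x < 1 and 0 < y < 2 else 0.0

def F(x, y, n=400):
    """Approximate the joint cdf F(x, y) with a midpoint Riemann sum
    over [0, x] x [0, y] (the density vanishes for negative arguments)."""
    du, dv = x / n, y / n
    total = 0.0
    for i in range(n):
        u = (i + 0.5) * du
        for j in range(n):
            v = (j + 0.5) * dv
            total += f(u, v) * du * dv
    return total

# On the support the exact cdf is x^2 * y^2 / 4, so F(0.5, 1.0) = 0.0625.
print(F(0.5, 1.0))   # ~0.0625
print(F(1.0, 2.0))   # ~1.0 (total probability)
```

The sum converges to the exact cdf; here the midpoint rule is even exact up to rounding, since the integrand is linear in each variable.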
Maybe you got this far in your co-requisite!
With a function \(g(x,y)\) you can take the derivative with respect to one variable at a time, holding the other variable constant. Notation: \[\frac{\partial}{\partial x}g(x,y) \quad \text{ and } \quad \frac{\partial}{\partial y}g(x,y).\]
When \(g\) is "smooth" (meaning the mixed second partial derivatives exist and are continuous), Clairaut's theorem gives the nice result: \[\frac{\partial}{\partial y}\left[\frac{\partial}{\partial x}g(x,y)\right]= \frac{\partial}{\partial x}\left[\frac{\partial}{\partial y}g(x,y)\right],\] and we just call this: \[\frac{\partial^2}{\partial x\partial y}g(x,y).\]
Just take all the "partial" derivatives in any order you like.
\[\frac{\partial^2}{\partial x\partial y}F(x,y) = f(x,y)\]
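This relation can be checked numerically: a central finite-difference estimate of the mixed partial of the cdf should recover the density. A small sketch, assuming the closed-form cdf \(F(x,y)=x^2y^2/4\) of the running example \(f(x,y)=xy\):

```python
def F(x, y):
    # Closed-form joint cdf for the density f(x, y) = x*y on
    # 0 < x < 1, 0 < y < 2, evaluated on the support: x^2 * y^2 / 4.
    return (x * x) * (y * y) / 4.0

def mixed_partial(F, x, y, h=1e-4):
    """Central finite-difference estimate of d^2 F / (dx dy)."""
    return (F(x + h, y + h) - F(x + h, y - h)
            - F(x - h, y + h) + F(x - h, y - h)) / (4.0 * h * h)

# d^2/(dx dy) of x^2 y^2 / 4 is x*y, recovering the density:
print(mixed_partial(F, 0.5, 1.0))   # ~0.5, which is f(0.5, 1.0)
```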
"Proof: …"
Examples can be challenging! Consider \(f(x,y) = xy\) on \(0<x<1\), \(0<y<2\) (…to be revisited…)
Just like in the discrete case we can recover information about \(X\) and \(Y\) individually by "integrating out" the other variable. The marginal densities are: \[{f_{\tiny{X}}}(x) = \int\limits_{-\infty}^\infty f(x,y)\,dy \quad \text{ and } \quad {f_{\tiny{Y}}}(y) = \int\limits_{-\infty}^\infty f(x,y)\,dx\]
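Integrating out one variable is easy to do numerically as well. A sketch for the running example \(f(x,y)=xy\) on \(0<x<1\), \(0<y<2\), again with midpoint sums (the support is bounded, so finite integration ranges suffice):

```python
def f(x, y):
    # Running example: f(x, y) = x*y on 0 < x < 1, 0 < y < 2.
    return x * y if 0 < x < 1 and 0 < y < 2 else 0.0

def marginal_x(x, n=2000):
    """f_X(x): integrate the joint density over y on [0, 2]."""
    dy = 2.0 / n
    return sum(f(x, (j + 0.5) * dy) for j in range(n)) * dy

def marginal_y(y, n=2000):
    """f_Y(y): integrate the joint density over x on [0, 1]."""
    dx = 1.0 / n
    return sum(f((i + 0.5) * dx, y) for i in range(n)) * dx

# Exact marginals: f_X(x) = 2x on (0,1) and f_Y(y) = y/2 on (0,2).
print(marginal_x(0.5))   # ~1.0
print(marginal_y(1.0))   # ~0.5
```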
What about the marginal cdfs? Let the other variable go to infinity: \({F_{\tiny{X}}}(x) = P(X \le x) = \lim\limits_{y\to\infty} F(x,y)\), and similarly for \({F_{\tiny{Y}}}\).
Continue the example on the previous slide.
Example D from the book.
\[f(x,y) = \begin{cases} \lambda^2\exp(-\lambda y) &: 0 \le x \le y,\, \lambda > 0\\ 0 &: \text{ otherwise.} \end{cases}\]
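The marginals of this density can be checked numerically too. Carrying out the integrals by hand gives \({f_{\tiny{X}}}(x) = \lambda e^{-\lambda x}\) (exponential) and \({f_{\tiny{Y}}}(y) = \lambda^2 y e^{-\lambda y}\) (a gamma density); the sketch below confirms both at a sample point, truncating the unbounded \(y\)-integral at a large cutoff:

```python
import math

LAM = 1.0  # lambda; any positive value works

def f(x, y):
    # Joint density lambda^2 * exp(-lambda * y) on 0 <= x <= y.
    return LAM**2 * math.exp(-LAM * y) if 0 <= x <= y else 0.0

def marginal_x(x, upper=40.0, n=4000):
    """f_X(x) = integral of f(x, y) over y; exact answer is
    lambda * exp(-lambda * x), i.e. X is exponential(lambda)."""
    dy = upper / n
    return sum(f(x, (j + 0.5) * dy) for j in range(n)) * dy

def marginal_y(y, n=4000):
    """f_Y(y) = integral of f(x, y) over x in [0, y]; exact answer is
    lambda^2 * y * exp(-lambda * y), a gamma(2, lambda) density."""
    dx = y / n
    return sum(f((i + 0.5) * dx, y) for i in range(n)) * dx

print(marginal_x(1.0), LAM * math.exp(-LAM * 1.0))          # both ~0.3679
print(marginal_y(1.0), LAM**2 * 1.0 * math.exp(-LAM * 1.0)) # both ~0.3679
```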
Exercise: review Example E from the book "Bivariate Normal". We will revisit this example.
Events \(A\) and \(B\) are independent if \(P(A\cap B) = P(A)P(B)\).
\(A \perp B \iff A^c \perp B \iff A \perp B^c \iff A^c \perp B^c\)
"Experiments" \(\mathcal{E}_A = \{A_1, A_2, \ldots\}\) and \(\mathcal{E}_B = \{B_1, B_2, \ldots\}\) are independent if \(A_i \perp B_j\) for all \(i,\,j\).
Definition: Random variables \(X\) and \(Y\) are independent if: \[P(X \in A, Y \in B) = P(X \in A)P(Y \in B)\] for any* \(A, B \subset \mathbb{R}\). (*Technically, any Borel-measurable \(A\) and \(B\); every set you will meet in practice qualifies.)
Theorem: \(X \perp Y\) if and only if the joint cdf \(F(x,y) = {F_{\tiny{X}}}(x){F_{\tiny{Y}}}(y)\) is the product of the marginal cdfs.
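The theorem is easy to illustrate on the running example \(f(x,y)=xy\) on \(0<x<1\), \(0<y<2\): on the support the joint cdf is \(x^2y^2/4\), which factors into the marginal cdfs \(x^2\) and \(y^2/4\). A minimal check:

```python
def F(x, y):
    # Joint cdf of the running example f(x, y) = x*y on 0<x<1, 0<y<2,
    # evaluated on the support: F(x, y) = x^2 * y^2 / 4.
    return x * x * y * y / 4.0

def F_X(x):
    # Marginal cdf of X: integral of f_X(u) = 2u from 0 to x, i.e. x^2.
    return x * x

def F_Y(y):
    # Marginal cdf of Y: integral of f_Y(v) = v/2 from 0 to y, i.e. y^2/4.
    return y * y / 4.0

# The joint cdf factors into the product of the marginal cdfs,
# so X and Y are independent here:
for x, y in [(0.2, 0.5), (0.5, 1.0), (0.9, 1.7)]:
    print(abs(F(x, y) - F_X(x) * F_Y(y)) < 1e-12)   # True each time
```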
Proof: \(\Longleftarrow\) ("if") too hard for us; \(\Longrightarrow\) ("only if") left as exercise.
Corollary: \(X \perp Y\) if and only if the joint \(f(x,y) = {f_{\tiny{X}}}(x){f_{\tiny{Y}}}(y)\)
To verify independence in practice, check two things: the support of \(f\) is a product set (e.g. a rectangle, possibly infinite), and on that support \(f(x,y)\) factors as \(g(x)h(y)\) for some functions \(g\) and \(h\).
\(f(x, y) = xy\) on \(0 < x < 1\) and \(0 < y < 2\).
\(f(x, y) = \lambda^2\exp(-\lambda y)\) on \(0 < x < y < \infty\).
\(f(x, y) = \lambda^3y\exp(-\lambda(x+y))\) on \(x > 0\) and \(y > 0\).
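The two checks above can be probed numerically with a grid heuristic: if \(f(x,y)=g(x)h(y)\) everywhere, then \(f(x,y)f(x',y') = f(x,y')f(x',y)\) for all points, and a single violation proves dependence. A sketch applying it to the three densities above (the hypothetical helper name and grid are mine; the constant in front of a density does not affect the verdict):

```python
import math

def check_product_form(f, xs, ys, tol=1e-9):
    """Grid heuristic: if f(x, y) = g(x) * h(y), then
    f(x, y) * f(x2, y2) == f(x, y2) * f(x2, y) for all grid points.
    One failure proves f does NOT factor (X and Y are dependent);
    passing on a finite grid is only supporting evidence."""
    for x in xs:
        for y in ys:
            for x2 in xs:
                for y2 in ys:
                    if abs(f(x, y) * f(x2, y2) - f(x, y2) * f(x2, y)) > tol:
                        return False
    return True

LAM = 1.0

# Example 1: f = x*y on a rectangle; factors, so independent.
f1 = lambda x, y: x * y if 0 < x < 1 and 0 < y < 2 else 0.0
# Example 2: support 0 < x < y is not a product set; dependent.
f2 = lambda x, y: LAM**2 * math.exp(-LAM * y) if 0 < x < y else 0.0
# Example 3: factors into exp(-lam*x) times y*exp(-lam*y) on a
# product support; independent.
f3 = lambda x, y: LAM**3 * y * math.exp(-LAM * (x + y)) if x > 0 and y > 0 else 0.0

grid = [0.3, 0.6, 0.9, 1.5]
print(check_product_form(f1, grid, grid))   # True
print(check_product_form(f2, grid, grid))   # False
print(check_product_form(f3, grid, grid))   # True
```

The failing case for the second density is geometric: points with \(x \ge y\) carry zero density, so the support is a triangle, not a rectangle, and the cross-products cannot balance.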